Recursive random variables with subgaussian distributions
Author
Abstract
Sequences of random variables (X_n)_{n≥0} satisfying a distributional recursion of the form X_n =^d Σ_{r=1}^{K} X^{(r)}_{I_r^{(n)}} + b_n for n ≥ n_0 are considered, with K, n_0 ≥ 1, where (X_n^{(r)})_{n≥0} is identically distributed as (X_n)_{n≥0} for r = 1, ..., K, I^{(n)} = (I_1^{(n)}, ..., I_K^{(n)}) is a random vector of integers in {0, ..., n−1}, and b_n is random, such that (X_n^{(1)})_{n≥0}, ..., (X_n^{(K)})_{n≥0} and (I^{(n)}, b_n) are independent. The symbol =^d denotes equality in distribution. In applications, the I_r^{(n)} are random subgroup sizes, b_n is a toll function specifying the particular quantity of a combinatorial structure, and (X_n^{(r)})_{n≥0} are copies of (X_n)_{n≥0} corresponding to the contribution of subgroup r. Typical parameters X_n range from the depths, sizes and path lengths of trees to the number of various sub-structures or components of combinatorial structures, the number ...
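As a concrete illustration of this type of recursion, the sketch below simulates one classical instance of it; the specific choices (K = 2, a uniform split I_1^{(n)}, and the Quicksort-type toll b_n = n − 1) are my own assumptions for the example, not taken from the paper.

```python
# Minimal sketch (assumed instance, not the paper's setup): sample X_n from
# X_n =^d X^{(1)}_{I_1} + X^{(2)}_{I_2} + b_n with K = 2, I_1 uniform on
# {0, ..., n-1}, I_2 = n - 1 - I_1, and toll b_n = n - 1 (Quicksort-type
# number of key comparisons).
import random

def sample_X(n: int) -> int:
    """Draw one realization of X_n by unfolding the recursion."""
    if n <= 1:                      # base case: X_0 = X_1 = 0
        return 0
    i1 = random.randrange(n)        # random subgroup size I_1^{(n)}
    i2 = n - 1 - i1                 # remaining subgroup size I_2^{(n)}
    b_n = n - 1                     # toll function
    return sample_X(i1) + sample_X(i2) + b_n

if __name__ == "__main__":
    n = 500
    samples = [sample_X(n) for _ in range(200)]
    print("mean of X_n:", sum(samples) / len(samples))   # roughly 2 n ln n for large n
```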
Similar references
Some Probability Inequalities for Quadratic Forms of Negatively Dependent Subgaussian Random Variables
In this paper, we obtain upper exponential bounds for the tail probabilities of quadratic forms of negatively dependent subgaussian random variables. In particular, the law of the iterated logarithm for quadratic forms of independent subgaussian random variables is generalized to the case of negatively dependent subgaussian random variables.
Non-Asymptotic Theory of Random Matrices
1 Definition The topic in this lecture is subgaussian random variables. We start with the definition and discuss some properties they satisfy. Definition 1 (Subgaussian random variables). A random variable X is subgaussian if there exist c, C > 0 such that P(|X| > t) ≤ C e^{−ct²} for all t ≥ 0. (1) As the name suggests, the notion of subgaussian random variables is a generalization of Gaussian random variables. Both the...
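To make the tail bound in (1) concrete, here is a small numerical check of my own (the constants C = 2 and c = 1/2 are an assumed, valid choice for a standard normal, not taken from the notes):

```python
# Empirically compare the tail P(|X| > t) of a standard normal with the
# subgaussian bound C * exp(-c t^2), using C = 2, c = 1/2.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal(1_000_000)
C, c = 2.0, 0.5

for t in (0.5, 1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(X) > t)     # empirical tail probability
    bound = C * np.exp(-c * t**2)          # subgaussian bound as in (1)
    print(f"t={t:.1f}  P(|X|>t)~{empirical:.4f}  bound={bound:.4f}")
```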
Convergence of series of dependent φ-subgaussian random variables
The almost sure convergence of weighted sums of φ-subgaussian m-acceptable random variables is investigated. As corollaries, the main results are applied to the case of negatively dependent and m-dependent subgaussian random variables. Finally, an application to random Fourier series is presented. © 2007 Elsevier Inc. All rights reserved.
Lecture 5 — September 8, 2016
1 Overview In the last lecture we took a more in-depth look at Chernoff bounds and introduced subgaussian and subexponential variables. In this lecture we will continue talking about subgaussian variables and related random variables, namely subexponential and subgamma variables, and finally we will give a proof of the famous Johnson-Lindenstrauss lemma using properties of subgaussian/subgamma variables. Definition ...
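The following is a hedged sketch of the Johnson-Lindenstrauss idea mentioned above, not the lecture's proof: project points with a Gaussian (hence subgaussian) matrix scaled by 1/√k and observe that pairwise distances are roughly preserved. The dimensions and sample points are arbitrary choices for illustration.

```python
# Random projection in the spirit of Johnson-Lindenstrauss: distances between
# projected points should be close to the original distances, up to ~1/sqrt(k).
import numpy as np

rng = np.random.default_rng(1)
n, d, k = 50, 1000, 200                        # points, original dim, target dim
points = rng.standard_normal((n, d))
A = rng.standard_normal((k, d)) / np.sqrt(k)   # scaled Gaussian projection matrix
projected = points @ A.T

# Compare a few pairwise distances before and after projection.
for i, j in [(0, 1), (2, 3), (4, 5)]:
    before = np.linalg.norm(points[i] - points[j])
    after = np.linalg.norm(projected[i] - projected[j])
    print(f"pair ({i},{j}): ratio after/before = {after / before:.3f}")
```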
Sparse Hanson-Wright inequalities for subgaussian quadratic forms
In this paper, we provide a proof for the Hanson-Wright inequalities for sparse quadratic forms in subgaussian random variables. This provides useful concentration inequalities for sparse subgaussian random vectors in two ways. Let X = (X1, . . . , Xm) ∈ R^m be a random vector with independent subgaussian components, and ξ = (ξ1, . . . , ξm) ∈ {0, 1}^m be independent Bernoulli random variables. We ...
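The short simulation below illustrates the kind of sparse quadratic form described above; the concrete distributions (standard normal X, Bernoulli(p) sparsity ξ, a fixed Gaussian matrix A) and the expectation formula in the comment are my own assumptions for the example, not the paper's results.

```python
# Simulate concentration of the sparse subgaussian quadratic form
# (xi ⊙ X)^T A (xi ⊙ X) around its expectation.
import numpy as np

rng = np.random.default_rng(2)
m, p, trials = 200, 0.1, 2000
A = rng.standard_normal((m, m))

vals = []
for _ in range(trials):
    X = rng.standard_normal(m)     # independent subgaussian coordinates
    xi = rng.random(m) < p         # Bernoulli(p) sparsity pattern
    y = X * xi                     # sparse subgaussian vector
    vals.append(y @ A @ y)

vals = np.array(vals)
# For this choice, E[y^T A y] = p * trace(A).
print("empirical mean:", vals.mean(), " p*trace(A):", p * np.trace(A))
print("empirical std :", vals.std())
```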
Journal:
Volume / Issue:
Pages: -
Publication year: 2005